Rényi's Entropy Rate for Discrete Markov Sources

Authors

  • Ziad Rached
  • Fady Alajaji
Abstract

In this work, we extend a variable-length source coding theorem for discrete memoryless sources to ergodic time-invariant Markov sources of arbitrary order. To accomplish this extension, we establish a formula for the Rényi entropy rate $\lim_{n\to\infty} H_\alpha(n)/n$. The main tool used to obtain the Rényi entropy rate result is Perron-Frobenius theory. We also examine the expression of the Rényi entropy rate for specific examples of Markov sources and investigate its limit as $\alpha \to 1$ and as $\alpha \to 0$. Finally, we conclude with numerical examples.
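
As a quick illustration of the formula above, here is a minimal Python sketch for the finite-alphabet, first-order case (the function names and the two-state transition matrix are illustrative assumptions, not taken from the paper). It uses the Perron-Frobenius characterization that, for α > 0 and α ≠ 1, the Rényi entropy rate equals log₂(λ_α)/(1 − α), where λ_α is the largest eigenvalue of the matrix with entries p_ij^α, and it checks numerically that this rate approaches the Shannon entropy rate as α → 1.

```python
import numpy as np


def renyi_entropy_rate(P, alpha):
    """Rényi entropy rate (bits/symbol) of an irreducible finite Markov chain.

    Sketch of the Perron-Frobenius formula: for alpha > 0, alpha != 1, the
    rate is log2(lambda_alpha) / (1 - alpha), where lambda_alpha is the
    largest eigenvalue of the matrix with entries p_ij ** alpha.
    """
    R = np.power(P, alpha)                    # entrywise p_ij^alpha
    lam = max(np.real(np.linalg.eigvals(R)))  # Perron-Frobenius eigenvalue
    return np.log2(lam) / (1.0 - alpha)


def shannon_entropy_rate(P):
    """Shannon entropy rate: sum_i pi_i * H(row i), pi = stationary distribution."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])  # left eigenvector for eigenvalue 1
    pi = pi / pi.sum()
    nz = P > 0
    row_entropy = -np.sum(np.where(nz, P * np.log2(np.where(nz, P, 1.0)), 0.0), axis=1)
    return float(pi @ row_entropy)


# Two-state example (hypothetical): the Rényi rate approaches the Shannon rate as alpha -> 1.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
for a in (0.25, 0.5, 0.999, 1.001, 2.0):
    print(f"alpha = {a:5}: {renyi_entropy_rate(P, a):.4f} bits/symbol")
print(f"Shannon rate : {shannon_entropy_rate(P):.4f} bits/symbol")
```

Evaluating the same expression for small α also suggests the α → 0 limit mentioned in the abstract, which is governed by the largest eigenvalue of the chain's 0-1 support matrix.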


Similar Articles

The Rate of Rényi Entropy for Irreducible Markov Chains

In this paper, we obtain the Rényi entropy rate for irreducible, aperiodic Markov chains with a countable state space, using the theory of countable nonnegative matrices. We also obtain a bound on the Rényi entropy rate of an irreducible Markov chain. Finally, we show that this bound is the Shannon entropy rate.


Relative Entropy Rate between a Markov Chain and Its Corresponding Hidden Markov Chain

In this paper, we study the relative entropy rate between a homogeneous Markov chain and a hidden Markov chain defined by observing the output of a discrete stochastic channel whose input is the finite-state-space homogeneous stationary Markov chain. For this purpose, we obtain the relative entropy between two finite subsequences of the above-mentioned chains with the help of the definition of...


Taylor Expansion for the Entropy Rate of Hidden Markov Chains

We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, computing the exact value of the entropy rate remains an open problem. We introduce some probability matrices based on the Markov chain's and the channel's parameters. Then, we try to obtain an estimate ...


ENTROPY FOR DTMC SIS EPIDEMIC MODEL

In this paper, first, a history of mathematical models is given. Next, some basic information about random variables, stochastic processes, and Markov chains is introduced. Then, the entropy for a discrete-time Markov process is presented. After that, the entropy for SIS stochastic models is computed, and it is proved that an epidemic will disappear after a long time.



Journal title:

Volume   Issue

Pages  -

Publication date: 2017